Low dimensional behavior in three-dimensional coupled map lattices
The analysis of one-, two-, and three-dimensional coupled map lattices is
developed here from a statistical and dynamical perspective. We show that the
three-dimensional CML exhibits low-dimensional behavior with long-range
correlation and a noise-like power spectrum. This approach leads to
an integrated understanding of the most important properties of these universal
models of spatiotemporal chaos. We perform a complete time-series analysis of
the model and investigate how the signal properties depend on the lattice
dimension.

Comment: 7 pages, 6 figures (revised)
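The dynamics studied in the abstract can be reproduced with a minimal diffusively coupled logistic-map lattice. The following is a sketch, not the paper's exact setup: the nearest-neighbour diffusive coupling form, the coupling strength eps, and the lattice size are illustrative assumptions (a 1D lattice is shown for brevity; the 3D case averages the six nearest neighbours).

```python
import numpy as np

def logistic(x, r=4.0):
    """Fully chaotic logistic map on [0, 1]."""
    return r * x * (1.0 - x)

def cml_step(lattice, eps=0.3):
    """One synchronous update of a diffusively coupled map lattice
    with periodic boundaries (1D shown; the 3D version averages over
    the six nearest neighbours instead of two)."""
    fx = logistic(lattice)
    neighbours = 0.5 * (np.roll(fx, 1) + np.roll(fx, -1))
    return (1.0 - eps) * fx + eps * neighbours

# Time series recorded at a single site: the raw material for the
# spectral and correlation analysis described in the abstract.
rng = np.random.default_rng(0)
x = rng.random(256)
series = np.empty(2000)
for i in range(2000):
    x = cml_step(x)
    series[i] = x[0]
```

Since the update is a convex combination of values in [0, 1], the lattice stays bounded, and the single-site series can be fed directly into a spectral or correlation analysis.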
Complexity-entropy causality plane: a useful approach for distinguishing songs
Nowadays we are often faced with huge databases resulting from the rapid
growth of data storage technologies. This is particularly true when dealing
with music databases. In this context, it is essential to have techniques and
tools able to discriminate properties from these massive sets. In this work, we
report on a statistical analysis of more than ten thousand songs aiming to
obtain a complexity hierarchy. Our approach is based on the estimation of the
permutation entropy combined with an intensive complexity measure, building up
the complexity-entropy causality plane. The results obtained indicate that this
representation space is very promising to discriminate songs as well as to
allow a relative quantitative comparison among songs. Additionally, we believe
that the method reported here may be applied in practical situations, since it
is simple, robust, and has a fast numerical implementation.

Comment: Accepted for publication in Physica
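The entropy axis of the complexity-entropy plane is the normalized Bandt-Pompe permutation entropy. A minimal sketch follows; the embedding dimension is an illustrative choice (delay 1), and the full causality plane additionally requires the Jensen-Shannon statistical complexity, omitted here for brevity.

```python
import math
import numpy as np

def permutation_entropy(signal, d=3):
    """Normalized Bandt-Pompe permutation entropy: slide a window of
    length d over the signal, record the ordinal pattern (the argsort
    of each window), and compute the Shannon entropy of the pattern
    distribution, normalized by its maximum value log(d!)."""
    signal = np.asarray(signal)
    n = len(signal) - d + 1
    counts = {}
    for i in range(n):
        pattern = tuple(np.argsort(signal[i:i + d]))
        counts[pattern] = counts.get(pattern, 0) + 1
    p = np.array(list(counts.values()), dtype=float) / n
    return float(-np.sum(p * np.log(p)) / math.log(math.factorial(d)))
```

A monotonic ramp realizes a single ordinal pattern (entropy 0), while white noise populates all d! patterns almost uniformly (entropy near 1); these are the two extremes between which the plane spreads songs out by complexity.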
Effects of coarse-graining on the scaling behavior of long-range correlated and anti-correlated signals
We investigate how various coarse-graining methods affect the scaling
properties of long-range power-law correlated and anti-correlated signals,
quantified by the detrended fluctuation analysis. Specifically, for
coarse-graining in the magnitude of a signal, we consider (i) the Floor, (ii)
the Symmetry, and (iii) the Centro-Symmetry coarse-graining methods. We find
that for anti-correlated signals, coarse-graining in the magnitude leads to a
crossover to random behavior at large scales, and that with increasing width of
the coarse-graining partition interval this crossover moves to intermediate and
small scales. In contrast, the scaling of positively correlated signals is less
affected by the coarse-graining: no changes are observable for fine partitions,
while for coarser partitions a crossover appears at small scales and moves to
intermediate and large scales as the partition width increases. For very rough
coarse-graining (large partition intervals) based on the Floor and Symmetry
methods, the position of the crossover stabilizes, in contrast to the
Centro-Symmetry method, where the crossover continuously moves across scales and
leads to random behavior at all scales, indicating a much stronger effect of the
Centro-Symmetry method compared to the Floor and the Symmetry methods.
For coarse-graining in time, where data points are averaged in non-overlapping
time windows, we find that the scaling for both anti-correlated and positively
correlated signals is practically preserved. The results of our simulations are
useful for the correct interpretation of the correlation and scaling properties
of symbolic sequences.

Comment: 19 pages, 13 figures
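The pipeline described above can be sketched compactly: coarse-grain the signal magnitude, then apply detrended fluctuation analysis. The sketch below assumes order-1 (linear) detrending and implements only the Floor method; the partition width delta and the scale range are illustrative.

```python
import numpy as np

def floor_coarse_grain(signal, delta):
    """Floor magnitude coarse-graining: partition the amplitude axis
    into intervals of width delta and map every value to the lower
    edge of its interval."""
    return delta * np.floor(signal / delta)

def dfa(signal, scales):
    """Detrended fluctuation analysis with linear detrending.
    Returns the fluctuation function F(n) for each window size n;
    the scaling exponent is the slope of log F(n) vs log n."""
    profile = np.cumsum(signal - np.mean(signal))
    F = []
    for n in scales:
        n_win = len(profile) // n
        segments = profile[:n_win * n].reshape(n_win, n)
        t = np.arange(n)
        sq = [np.mean((seg - np.polyval(np.polyfit(t, seg, 1), t)) ** 2)
              for seg in segments]
        F.append(np.sqrt(np.mean(sq)))
    return np.asarray(F)

# Sanity check: white noise should give an exponent near 0.5.
rng = np.random.default_rng(2)
noise = rng.standard_normal(4096)
scales = np.array([8, 16, 32, 64, 128, 256])
alpha = np.polyfit(np.log(scales), np.log(dfa(noise, scales)), 1)[0]
```

Comparing the exponent of the raw signal with that of floor_coarse_grain(signal, delta) for growing delta reproduces the kind of crossover study the abstract describes.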
A family of membrane-shaping proteins at ER subdomains regulates pre-peroxisomal vesicle biogenesis
Saccharomyces cerevisiae contains three conserved reticulon and reticulon-like proteins that help maintain ER structure by stabilizing high membrane curvature in ER tubules and the edges of ER sheets. A mutant lacking all three proteins has dramatically altered ER morphology. We found that ER shape is restored in this mutant when Pex30p or its homologue Pex31p is overexpressed. Pex30p can tubulate membranes both in cells and when reconstituted into proteoliposomes, indicating that Pex30p is a novel ER-shaping protein. In contrast to the reticulons, Pex30p is of low abundance, and we found that it localizes to subdomains in the ER. We show that these ER subdomains are the sites where most pre-peroxisomal vesicles (PPVs) are generated. In addition, overproduction or deletion of Pex30p or Pex31p alters the size, shape, and number of PPVs. Our findings suggest that Pex30p and Pex31p help shape and generate the regions of the ER where PPV biogenesis occurs.
Dynamics of conflicts in Wikipedia
In this work we study the dynamical features of editorial wars in Wikipedia
(WP). Based on our previously established algorithm, we build up samples of
controversial and peaceful articles and analyze the temporal characteristics of
the activity in these samples. On short time scales, we show that there is a
clear correspondence between conflict and burstiness of activity patterns, and
that memory effects play an important role in controversies. On long time
scales, we identify three distinct developmental patterns for the overall
behavior of the articles. We are able to distinguish cases eventually leading
to consensus from those cases where a compromise is far from achievable.
Finally, we analyze discussion networks and conclude that edit wars are mainly
fought by only a few editors.

Comment: Supporting information added
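The short-time-scale quantities mentioned above, burstiness and memory of the activity pattern, are commonly measured on the sequence of inter-event times. The sketch below assumes the standard Goh-Barabási definitions, which may differ in detail from the measures used in the paper.

```python
import numpy as np

def burstiness(event_times):
    """Burstiness B = (sigma - mu)/(sigma + mu) of the inter-event
    times: -1 for a perfectly regular sequence, ~0 for a Poisson
    process, approaching +1 for extremely bursty activity."""
    tau = np.diff(np.sort(event_times))
    mu, sigma = tau.mean(), tau.std()
    return (sigma - mu) / (sigma + mu)

def memory(event_times):
    """Memory coefficient: Pearson correlation between consecutive
    inter-event times; positive when long gaps tend to follow long
    gaps and short gaps follow short gaps."""
    tau = np.diff(np.sort(event_times))
    return np.corrcoef(tau[:-1], tau[1:])[0, 1]
```

Applied to the edit timestamps of an article, high burstiness together with positive memory is the signature the abstract associates with controversial pages.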
Is there a vortex-glass transition in high-temperature superconductors?
We show that DC voltage versus current measurements of a YBCO micro-bridge in
a magnetic field can be collapsed onto scaling functions proposed by Fisher,
Fisher, and Huse, as is widely reported in the literature. We find, however,
that good data collapse is achieved for a wide range of critical exponents and
temperatures. These results strongly suggest that agreement with scaling alone
does not prove the existence of a phase transition. We propose a criterion to
determine if the data collapse is valid, and thus if a phase transition occurs.
To our knowledge, none of the data reported in the literature meet our
criterion.

Comment: 4 pages, 4 figures
Supernova Type Ia progenitors from merging double white dwarfs: Using a new population synthesis model
The study of Type Ia supernovae (SNIa) has led to greatly improved insights
into many fields of astrophysics; however, a theoretical explanation of the
origin of these events is still lacking. We investigate the potential
contribution to the SNIa rate from the population of merging double
carbon-oxygen white dwarfs. We aim to develop a model that fits the observed
SNIa progenitors as well as the observed close double white dwarf population.
We differentiate between two scenarios for the common envelope (CE) evolution:
the alpha-formalism, based on the energy equation, and the gamma-formalism,
based on the angular momentum equation. In the first model the alpha-formalism
is always applied. In the second model the gamma-formalism is applied, unless
the binary contains a compact object or the CE is triggered by a tidal
instability, in which case the alpha-formalism is used. The binary population
synthesis code SeBa was used to evolve binary systems from the zero-age main
sequence to the formation of double white dwarfs and subsequent mergers. SeBa
has been thoroughly updated since the last publication of the content of the
code. The limited sample of observed double white dwarfs is better represented
by the simulated population using the gamma-formalism than the alpha-formalism.
For both CE formalisms, we find that although the morphology of the simulated
delay time distribution matches that of the observations within the errors, the
normalisation and time-integrated rate per stellar mass are a factor of 7-12 lower
than observed. Furthermore, the characteristics of the simulated populations of
merging double carbon-oxygen white dwarfs are discussed and put in the context
of alternative SNIa models for merging double white dwarfs.

Comment: 16 pages (including a 4-page appendix), 15 figures
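The alpha-formalism mentioned above balances the envelope binding energy against a fraction alpha of the released orbital energy. The sketch below assumes the common Webbink-style parametrization of the binding energy, G*m_donor*m_env/(lambda*R); the default alpha and lambda values are illustrative, not those of SeBa.

```python
def alpha_ce_final_separation(m_donor, m_core, m_comp, a_i, r_donor,
                              alpha=1.0, lam=0.5):
    """Final orbital separation after a common-envelope phase in the
    alpha (energy) formalism:
        G*m_donor*m_env/(lam*r_donor) =
            alpha * (G*m_core*m_comp/(2*a_f) - G*m_donor*m_comp/(2*a_i))
    solved for a_f. G cancels, so masses and lengths only need to be
    in mutually consistent units (e.g. Msun and Rsun)."""
    m_env = m_donor - m_core
    inv_af = (2.0 * m_donor * m_env / (alpha * lam * r_donor)
              + m_donor * m_comp / a_i) / (m_core * m_comp)
    return 1.0 / inv_af
```

Ejecting the envelope always shrinks the orbit in this formalism; the gamma-formalism instead books the envelope loss against the angular momentum budget and can leave much wider systems, which is why the two prescriptions predict different double white dwarf populations.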
Laminar flame characteristics of natural gas and dissociated methanol mixtures diluted by nitrogen
The effect of dissociated methanol (H2:CO = 2:1 by volume) on the laminar burning velocity of natural gas (methane as the main component) was studied using a constant-volume bomb (CVB). Nitrogen, as a diluent gas, was added to the natural gas (CH4) - dissociated methanol (DM) mixtures to investigate the dilution effect. Experiments were conducted at an initial temperature of 343 K and an initial pressure of 0.3 MPa, with equivalence ratios from 0.8 to 1.4. Laminar burning velocities were calculated from Schlieren photographs, correlation of in-cylinder pressure data, and Chemkin-Pro. The results show an increase in laminar burning velocity with initial temperature and the proportion of dissociated methanol, but a decrease with initial pressure and the proportion of nitrogen. At the stoichiometric ratio, the laminar burning velocities were 25.1 cm/s, 38.7 cm/s, and 83.2 cm/s for dissociated methanol proportions of 0%, 40%, and 80%, respectively. Adding more dissociated methanol tends to shift the peak burning velocity towards the richer side, while adding nitrogen has the opposite effect. More dissociated methanol also leads to an earlier onset of flame cellularity.
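From a Schlieren radius history, the unstretched flame speed is conventionally recovered by linear extrapolation to zero stretch. The sketch below assumes the linear stretch relation S_n = S_L - L_b*kappa (the Markstein-length model); the function names and the density-ratio conversion are standard practice for constant-volume bomb data, not quantities taken from this paper.

```python
import numpy as np

def unstretched_flame_speed(t, r):
    """Given the radius history r(t) of a spherically expanding
    flame, compute the stretched speed S_n = dr/dt and the stretch
    rate kappa = 2*S_n/r, then fit S_n = S_L - L_b*kappa to
    extrapolate to zero stretch. Returns (S_L, L_b)."""
    sn = np.gradient(r, t)
    kappa = 2.0 * sn / r
    slope, intercept = np.polyfit(kappa, sn, 1)
    return intercept, -slope

def laminar_burning_velocity(s_l, rho_u, rho_b):
    """Mass continuity across the flame front converts the observed
    flame speed into the laminar burning velocity using the
    burned/unburned density ratio."""
    return s_l * rho_b / rho_u
```

In practice the radius range is restricted to exclude the ignition-affected early stage and the pressure-affected late stage before fitting.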
Model validation for a noninvasive arterial stenosis detection problem
Copyright (c) 2013 American Institute of Mathematical Sciences.

A current thrust in medical research is the development of a non-invasive method for the detection, localization, and characterization of an arterial stenosis (a blockage or partial blockage in an artery). A method has been proposed to detect shear waves in the chest cavity that are generated by disturbances in the blood flow resulting from a stenosis. To develop this methodology further, we use both one-dimensional pressure and shear wave experimental data from novel acoustic phantoms to validate the corresponding viscoelastic mathematical models, which were developed in a concept paper [8] and are refined here. We estimate model parameters that give a good fit (in a sense to be precisely defined) to the experimental data, and use asymptotic error theory to provide confidence intervals for the parameter estimates. Finally, since a robust error model is necessary for accurate parameter estimates and confidence analysis, we include a comparison of absolute and relative models for the measurement error.

Funding: The National Institute of Allergy and Infectious Diseases, the Air Force Office of Scientific Research, the Department of Education, and the Engineering and Physical Sciences Research Council (EPSRC).
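For an absolute (constant-variance) error model, the asymptotic error theory invoked above yields the familiar covariance estimate cov(theta) ~ sigma^2 (X^T X)^{-1}. A linear-model sketch follows (a relative error model would instead weight residuals by the model values); the function name and the synthetic data are ours, not the paper's.

```python
import numpy as np

def ols_with_ci(X, y, z=1.96):
    """Ordinary least squares with asymptotic ~95% confidence
    intervals under an absolute error model: residual variance is
    estimated with n - p degrees of freedom and propagated through
    cov(theta) = sigma2 * (X^T X)^{-1}."""
    theta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ theta
    n, p = X.shape
    sigma2 = resid @ resid / (n - p)
    half = z * np.sqrt(np.diag(sigma2 * np.linalg.inv(X.T @ X)))
    return theta, theta - half, theta + half

# Recover intercept 2 and slope 3 from noisy synthetic data.
rng = np.random.default_rng(4)
t = np.linspace(0.0, 1.0, 100)
X = np.column_stack([np.ones_like(t), t])
y = 2.0 + 3.0 * t + 0.1 * rng.standard_normal(100)
theta, lo, hi = ols_with_ci(X, y)
```

For a nonlinear viscoelastic model, X is replaced by the Jacobian of the model with respect to the parameters, evaluated at the estimate; the covariance formula is otherwise unchanged.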
A mathematical model for breath gas analysis of volatile organic compounds with special emphasis on acetone
Recommended standardized procedures for determining exhaled lower respiratory
nitric oxide and nasal nitric oxide have been developed by task forces of the
European Respiratory Society and the American Thoracic Society. These
recommendations have paved the way for the measurement of nitric oxide to
become a diagnostic tool for specific clinical applications. It would be
desirable to develop similar guidelines for the sampling of other trace gases
in exhaled breath, especially volatile organic compounds (VOCs) which reflect
ongoing metabolism. The concentrations of water-soluble, blood-borne substances
in exhaled breath are influenced by: (i) breathing patterns affecting gas
exchange in the conducting airways; (ii) the concentrations in the
tracheo-bronchial lining fluid; (iii) the alveolar and systemic concentrations
of the compound. The classical Farhi equation takes only the alveolar
concentrations into account. Real-time measurements of acetone in end-tidal
breath under an ergometer challenge show characteristics which cannot be
explained within the Farhi setting. Here we develop a compartment model that
reliably captures these profiles and is capable of relating breath to the
systemic concentrations of acetone. By comparison with experimental data it is
inferred that the major part of variability in breath acetone concentrations
(e.g., in response to moderate exercise or altered breathing patterns) can be
attributed to airway gas exchange, with minimal changes of the underlying blood
and tissue concentrations. Moreover, it is deduced that measured end-tidal
breath concentrations of acetone determined during resting conditions and free
breathing will be rather poor indicators of endogenous levels. In particular,
the current formulation includes the classical Farhi and the Scheid series
inhomogeneity models as special limiting cases.

Comment: 38 pages
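The classical Farhi relation referenced above is simple enough to state in code. A sketch, with an acetone partition coefficient of order a few hundred taken as an illustrative literature-range value, not from this paper:

```python
def farhi_alveolar_concentration(c_venous, lambda_ba, v_alv, q_c):
    """Classical Farhi equation for an inert gas at steady state:
    alveolar air concentration = mixed-venous blood concentration
    divided by (blood:air partition coefficient + the
    ventilation-perfusion ratio v_alv/q_c)."""
    return c_venous / (lambda_ba + v_alv / q_c)

# Acetone is highly blood-soluble (lambda_ba on the order of a few
# hundred at body temperature), so the Farhi prediction barely moves
# when the ventilation-perfusion ratio changes -- which is why it
# cannot reproduce the exercise-induced variability and airway gas
# exchange must be modelled, as the abstract argues.
rest = farhi_alveolar_concentration(1.0, 340.0, 6.0, 6.0)
exercise = farhi_alveolar_concentration(1.0, 340.0, 60.0, 20.0)
```

The compartment model of the paper adds an airway exchange term on top of this alveolar relation, recovering Farhi (and the Scheid series inhomogeneity model) in the appropriate limits.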